Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization

Authors

  • Ángel F. García-Fernández
  • Ba-Ngu Vo
Abstract

In this paper, we provide novel derivations of the probability hypothesis density (PHD) and cardinalised PHD (CPHD) filters without using probability generating functionals or functional derivatives. We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimisations after the prediction and update steps. We perform the KLD minimisations directly on the multitarget prediction and posterior densities.
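As a minimal sketch of the mechanism the abstract describes, not the paper's multitarget derivation: minimising the KLD from a fixed cardinality distribution p(n) to a Poisson model recovers moment matching, with optimal rate λ* = E_p[N]. This is the same sense in which the PHD filter's Poisson approximation is fixed by the first moment (the intensity). The distribution below is hypothetical.

```python
# Minimal sketch (not the paper's multitarget derivation): minimising
# KL(p || Poisson(lam)) over the Poisson rate lam recovers moment matching,
# lam* = E_p[N]. The PHD filter's Poisson approximation is pinned down the
# same way: the best Poisson fit matches the first moment (the intensity).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

# A hypothetical "true" cardinality distribution p(n) on n = 0..10.
p = np.array([0.05, 0.10, 0.20, 0.25, 0.15, 0.10, 0.07, 0.04, 0.02, 0.01, 0.01])
n = np.arange(len(p))

def kld(lam):
    """KL(p || Poisson(lam)), evaluated over the support of p."""
    return np.sum(p * (np.log(p) - poisson.logpmf(n, lam)))

lam_star = minimize_scalar(kld, bounds=(1e-3, 20.0), method="bounded").x
print(f"argmin of KLD: {lam_star:.4f}")   # numerically equal to E_p[N]
print(f"E_p[N]:        {n @ p:.4f}")      # moment matching, as predicted
```

The closed-form minimiser of the objective is λ* = Σ_n n p(n), so the numerical optimum printed above agrees with the mean to optimiser tolerance.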


Related articles

Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating a true density h(.) based on a random sample X1, …, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a model confidence set, for the unknown model h(.). Application of such confide...
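A rough sketch of the test this article builds on, with hypothetical models and data rather than the article's construction: Vuong's (1989) statistic standardises the mean per-sample log-likelihood ratio of two non-nested candidates and is asymptotically standard normal when both are equally close to h in KLD.

```python
# Sketch of Vuong's (1989) non-nested test (hypothetical models and data,
# not the article's model-confidence-set construction). For log-likelihood
# ratios m_i = log f(x_i) - log g(x_i), Z = sqrt(n) * mean(m) / sd(m) is
# asymptotically N(0, 1) when f and g are equally close to h in KLD.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.standard_t(df=5, size=500)   # sample from an unknown true density h

log_f = norm.logpdf(x)               # candidate f: standard normal
log_g = -np.abs(x) - np.log(2.0)     # candidate g: standard Laplace

m = log_f - log_g
z = np.sqrt(len(m)) * m.mean() / m.std(ddof=1)
print(f"Vuong Z = {z:.3f}, two-sided p = {2 * norm.sf(abs(z)):.3f}")
```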


Unscented Auxiliary Particle Filter Implementation of the Cardinalized Probability Hypothesis Density Filters

The probability hypothesis density (PHD) filter suffers from a lack of precise estimation of the expected number of targets. The cardinalized PHD (CPHD) recursion, as a generalization of the PHD recursion, remedies this flaw and simultaneously propagates the intensity function and the posterior cardinality distribution. While there are a few new approaches to enhance the Sequential Monte Carlo (S...
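A toy contrast of the two cardinality estimators this snippet compares; the numbers are hypothetical and this is not the cited unscented auxiliary particle filter implementation.

```python
# Toy contrast of PHD vs. CPHD cardinality estimation (hypothetical values).
import numpy as np

# PHD: the expected target number is the integral of the intensity, which an
# SMC implementation approximates by the sum of the particle weights.
weights = np.array([0.8, 0.7, 0.9, 0.6])
print(f"PHD expected cardinality: {weights.sum():.2f}")

# CPHD: a posterior cardinality pmf is also propagated, so e.g. a MAP
# estimate of the number of targets is available alongside the mean.
card_pmf = np.array([0.05, 0.15, 0.30, 0.35, 0.15])   # pmf over n = 0..4
print(f"CPHD MAP cardinality:  {int(np.argmax(card_pmf))}")
print(f"CPHD mean cardinality: {np.arange(5) @ card_pmf:.2f}")
```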


Simple Entropic Derivation of a Generalized Black-Scholes Option Pricing Model

A straightforward derivation of the celebrated Black-Scholes option pricing model is obtained by solving a simple constrained minimization of relative entropy. The derivation leads to a natural generalization of the model, which is consistent with some evidence from stock index option markets.
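For reference, a compact sketch of the standard Black-Scholes European call price that the entropic derivation recovers; the formula is textbook, and the parameter values below are hypothetical.

```python
# Standard Black-Scholes European call price (textbook formula; the
# parameter values are hypothetical, chosen only for the printout).
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, r, sigma, T):
    """C = S * Phi(d1) - K * exp(-r T) * Phi(d2)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    Phi = NormalDist().cdf
    return S * Phi(d1) - K * exp(-r * T) * Phi(d2)

print(f"Call price: {bs_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0):.4f}")
```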


Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper, we review measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
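A quick numerical illustration (my own, not from the article) of three of the surveyed measures for two discrete distributions p and q:

```python
# KL information, J-divergence, and Hellinger distance between two
# hypothetical discrete distributions p and q.
import numpy as np

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

kl_pq = np.sum(p * np.log(p / q))            # Kullback-Leibler information
j_div = kl_pq + np.sum(q * np.log(q / p))    # J-divergence (symmetrised KL)
hellinger = np.sqrt(0.5) * np.linalg.norm(np.sqrt(p) - np.sqrt(q))  # Hellinger

print(f"KL(p||q) = {kl_pq:.4f}")
print(f"J(p, q)  = {j_div:.4f}")
print(f"H(p, q)  = {hellinger:.4f}")
```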


Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem introduced by Srivastava [Sr], which is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...
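A hedged illustration of the general idea, not the article's search-design criterion: when two rival models' predictions at a design point are Gaussian with a common variance, the KL distance between them grows with the squared separation of the means, so a design yielding larger divergence discriminates the models better.

```python
# KL(N(mu1, s^2) || N(mu2, s^2)) = (mu1 - mu2)^2 / (2 s^2): a standard
# closed form, used here only to illustrate KL-based model discrimination
# (hypothetical design points, not the article's criterion).
def kl_gauss_common_var(mu1: float, mu2: float, sigma: float) -> float:
    return (mu1 - mu2) ** 2 / (2.0 * sigma ** 2)

# The second design point separates the rival models' means more widely,
# hence the larger divergence and better discrimination.
print(kl_gauss_common_var(1.0, 1.5, 1.0))   # 0.125
print(kl_gauss_common_var(1.0, 2.5, 1.0))   # 1.125
```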



Journal:
  • IEEE Trans. Signal Processing

Volume 63, Issue -

Pages -

Publication date: 2015